Credibility of Observational Social Research (META-REP)

With 15 individual projects and over 50 participating scientists, META-REP investigates fundamental questions about replicability.

Katrin Auspurg is co-applicant and program committee member in the DFG-funded priority program "META-REP: A Meta-scientific Program to Analyze and Optimize Replicability in the Behavioral, Social, and Cognitive Sciences."

META-REP aims to analyze the replicability, robustness, and generalizability of scientific results. It focuses on three questions:

  • What exactly is considered "replication" in different scientific disciplines, and how replicable are published results?
  • Why do replication rates vary within and between disciplines?
  • How can replicability be improved in the long term?

By investigating these questions, the priority program makes a fundamental contribution to understanding replicability as a central quality criterion of empirical research.

A detailed description of the META-REP focus program can be found on the website of the Department of Psychology (LMU).

Phase 1 (2022–2025)

“Enhancing Reproducibility and Robustness of Observational Social Science Research”

As part of the DFG's META-REP priority program, Prof. Auspurg and Dr. Schneck have secured DFG funding for a sub-project on the robustness and sensitivity of results based on non-experimental data in the social sciences. The goal of this project is to develop diagnostic tools (based in particular on computer-assisted statistical analyses such as "multiverse" or "multi-model" analyses) for assessing the robustness of results with respect to different model and sample specifications.
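
The core idea of a multiverse or multi-model analysis is to enumerate the defensible analytic choices and re-estimate the result under every combination. The following is a minimal sketch of that idea, using toy data and two hypothetical choices (outlier handling and outcome transformation); it is not the project's actual tooling:

```python
from itertools import product
from statistics import mean
import math

# Toy data with one extreme case (hypothetical numbers, not the project's data).
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 20.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8, 12.2, 60.0]

def ols_slope(xs, ys):
    """Closed-form slope of a simple (bivariate) OLS regression."""
    mx, my = mean(xs), mean(ys)
    return (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
            / sum((a - mx) ** 2 for a in xs))

# Two illustrative analytic choices; a real multiverse would enumerate many more
# (weighting schemes, imputation methods, covariate sets, ...).
choices = {"drop_outlier": [False, True], "log_outcome": [False, True]}

results = {}
for drop, log_y in product(choices["drop_outlier"], choices["log_outcome"]):
    xs, ys = (x[:-1], y[:-1]) if drop else (x, y)
    if log_y:
        ys = [math.log(v) for v in ys]
    results[(drop, log_y)] = ols_slope(xs, ys)

# The spread of estimates across specifications indicates specification sensitivity.
print(f"{len(results)} specifications, slopes from "
      f"{min(results.values()):.2f} to {max(results.values()):.2f}")
```

A wide spread across specifications flags a result as specification-sensitive; a narrow one supports its robustness.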

In a second step, these tools will then be used to assess the robustness and thus the credibility of articles published in leading social science journals and/or based on data from the European Social Survey (ESS).

First, we will check whether the results of the articles can be reproduced using the authors' original code. In addition, this inventory will be linked to initial analyses of risk and incentive structures for more or less credible social science (based on theories from analytical sociology and quantitative science studies). Another project goal is to develop evidence-based proposals for tools to increase robust and credible results in the context of more transparent "open" science (including new publication formats).
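
A numerical reproduction check of this kind can be reduced to comparing recomputed statistics against the values printed in an article, within rounding tolerance. A minimal sketch with hypothetical numbers (the project's actual checks re-run the authors' original code on their data):

```python
# Hypothetical reported vs. recomputed values (illustrative only).
reported = {"coef": 0.42, "se": 0.11}        # as printed in the article (2 decimals)
reproduced = {"coef": 0.4213, "se": 0.1094}  # obtained from a re-run of the analysis

def reproduces(reported_val, recomputed_val, decimals=2):
    """Count a value as reproduced if the recomputed value rounds to the reported one."""
    return round(recomputed_val, decimals) == round(reported_val, decimals)

checks = {name: reproduces(reported[name], reproduced[name]) for name in reported}
print(checks)
```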

Project number
464507200
Start
April 2022
End
March 2025
Lead
Prof. Dr. Katrin Auspurg
Dr. Andreas Schneck
Team
Daniel Krähmer
Laura Schächtele
Publications
Krähmer, D., Schächtele, L., & Schneck, A. (2023). Care to share? Experimental evidence on code sharing behavior in the social sciences. PLOS ONE, 18(8): e0289380. doi.org/10.1371/journal.pone.0289380

Phase 2 (2025–2028)

“Enhancing the Robustness of Observational Social Science Research by Computational Multi-Model Analyses”

The project conducts analyses on the reproducibility and robustness of articles that use the same "large-N" observational data (European Social Survey). However, the articles differ in terms of disciplines, journals, author constellations, and other factors that are likely to be associated with varying reproduction rates.

The sub-project proceeds in four steps:

  • During the first phase of the project, we (A) checked the accessibility of replication materials for approximately 1,200 articles in an "openness audit" (requesting authors to provide data and analysis code).
  • We then (B) conducted reproducibility analyses on a random subset of 100 articles (can the authors' results be reproduced when their code is applied to their data?).
  • For the second phase of the project, we plan (C) to carry out "correctness/congruence checks" for these articles (checking for coding errors and for "congruence" between what is reported in the articles and what the analysis code actually does).
  • In the final step, we will (D) focus on robustness analyses (do the results remain stable under seemingly marginal changes in data preparation or analysis, such as changes in weighting, imputation, or outlier treatment?).
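
The robustness step in the final bullet can be illustrated by recomputing an estimate under different outlier treatments and comparing the results. A minimal sketch with hypothetical data (the project's actual checks operate on published ESS-based analyses):

```python
from statistics import mean

# Hypothetical sample with one extreme value (illustrative only).
sample = [3.1, 2.8, 3.4, 2.9, 3.2, 3.0, 9.5]

def winsorize(values, k=1):
    """Replace the k smallest/largest values with the nearest retained value."""
    s = sorted(values)
    lo, hi = s[k], s[-k - 1]
    return [min(max(v, lo), hi) for v in values]

def trim(values, k=1):
    """Drop the k smallest and the k largest values."""
    s = sorted(values)
    return s[k:len(s) - k]

estimates = {
    "raw mean": mean(sample),
    "winsorized mean": mean(winsorize(sample)),
    "trimmed mean": mean(trim(sample)),
}
for name, est in estimates.items():
    print(f"{name}: {est:.2f}")

# A finding counts as robust here if the estimates agree within a pre-set tolerance.
spread = max(estimates.values()) - min(estimates.values())
print(f"spread across outlier treatments: {spread:.2f}")
```

The same pattern extends to the other choices named above (weighting and imputation): each treatment is one more branch whose estimate is compared against the rest.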

We aim to conduct a comprehensive audit of potential threats to transparent and credible research. Unlike existing audits, we take into account publications with both high and low impact. In addition, our audit enables comparisons between disciplines with varying degrees of implementation of the "FAIR principles" (findable, accessible, interoperable, reusable materials). This allows us to identify research areas where measures for improvement appear particularly appropriate and effective. The three closely related, overarching research goals for the second project phase are:

  1. Completion of work to analyze the extent of reproducible and robust results at the four levels of our audit (as a contribution to the META-REP "What");
  2. Analysis of conditions for varying reproducibility and robustness, including analysis of characteristics at the article, author, and journal levels (as a contribution to the META-REP "Why"); and
  3. Development of easily implementable measures (such as computer code) to improve reproducibility and robustness (as a contribution to the META-REP "How").

Project number
Start
April 2025
End
March 2028
Lead
Prof. Dr. Katrin Auspurg
Prof. Dr. Josef Brüderl
Team
Laura Schächtele
Daniel Krähmer
Publications